33 research outputs found

    On estimating the Hurst parameter from least-squares residuals. Case study: Correlated terrestrial laser scanner range noise

    Many signals appear fractal, exhibiting self-similarity over a large range of scales in their power spectral densities. They can be described by so-called Hermite processes, among which the first-order one is called fractional Brownian motion (fBm) and has a wide range of applications. The fractional Gaussian noise (fGn) series is the series of successive differences between elements of an fBm series; it is stationary and completely characterized by two parameters: the variance and the Hurst coefficient (H). From physical considerations, the fGn could be used to model the noise of observations coming from sensors working with, e.g., phase differences: due to the high recording rate, temporal correlations are expected to show long-range dependency (LRD), decaying hyperbolically rather than exponentially. For the rigorous testing of deformations detected with terrestrial laser scanners (TLS), the correct determination of the correlation structure of the observations is mandatory. In this study, we show that the residuals from surface approximations with regression B-splines from simulated TLS data allow the estimation of the Hurst parameter of a known correlated input noise. We derive a simple procedure to filter the residuals in the presence of additional white noise or low frequencies. Our methodology can be applied to any kind of residuals where the presence of additional noise and/or biases due to short samples or inaccurate functional modeling makes the estimation of the Hurst coefficient with usual methods, such as maximum likelihood estimators, imprecise. We demonstrate the feasibility of our proposal with real observations from a white plate scanned by a TLS.
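
    As a minimal illustration of the spectral property exploited here: the periodogram of an fGn series behaves like f^(1-2H) at low frequencies, so H can be read off a log-log slope. The Python sketch below assumes a plain periodogram-slope estimator applied to a residual series; the low-frequency band fraction is an arbitrary choice, and the filtering procedure proposed in the paper is not reproduced.

```python
import numpy as np

def estimate_hurst_periodogram(residuals):
    """Rough Hurst estimate for a stationary, fGn-like residual series.

    The fGn periodogram behaves like f**(1 - 2H) at low frequencies,
    so H follows from the log-log slope. Illustrative only: the
    band fraction below is an assumption.
    """
    x = np.asarray(residuals, dtype=float)
    x = x - x.mean()
    freqs = np.fft.rfftfreq(x.size)[1:]            # drop zero frequency
    pgram = np.abs(np.fft.rfft(x)[1:]) ** 2 / x.size
    m = max(8, freqs.size // 8)                    # low-frequency band only
    slope, _ = np.polyfit(np.log(freqs[:m]), np.log(pgram[:m]), 1)
    return (1.0 - slope) / 2.0                     # slope = 1 - 2H
```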

    How to account for temporal correlations with a diagonal correlation model in a nonlinear functional model: A plane fitting with simulated and real TLS measurements

    To avoid computational burden, diagonal variance-covariance matrices (VCM) are preferred for describing the stochasticity of terrestrial laser scanner (TLS) measurements. This simplification neglects correlations and affects the least-squares (LS) estimates, which are trustworthy and of minimal variance only if the correct stochastic model is used. When a linearization of the LS functional model is performed, a bias of the estimated parameters and of their dispersion occurs, which can be investigated using a second-order Taylor expansion. Both the computation of the second-order solution and the accounting for correlations are linked with computational burden. In this contribution, we study the impact of an enhanced stochastic model on that bias in order to weigh the corresponding computational costs against the improvements. To that aim, we model the temporal correlations of TLS measurements using the Matérn covariance function, combined with an intensity model for the variance. We further study how the scanning configuration influences the solution. Because it may be tempting to neglect correlations to avoid VCM inversions and multiplications, we quantify the impact of such a reduction and propose an innovative yet simple way to account for correlations with a “diagonal VCM.” Originally developed for GPS measurements and linear LS, this model is extended and validated for TLS range measurements and called the diagonal correlation model (DCM).
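
    To make the enhanced stochastic model concrete, the sketch below builds a fully populated temporal VCM from the standard Matérn covariance parameterization. The parameterization and the numerical values are assumptions for illustration; the intensity model for the variance used in the paper is not reproduced here.

```python
import numpy as np
from scipy.special import gamma, kv

def matern_cov(tau, sigma2, alpha, nu):
    """Matérn temporal covariance
    C(tau) = sigma2 * 2**(1-nu) / Gamma(nu) * (alpha*tau)**nu * K_nu(alpha*tau);
    standard parameterization, assumed here for illustration."""
    tau = np.abs(np.asarray(tau, dtype=float))
    cov = np.full_like(tau, sigma2)                # C(0) = sigma2
    nz = tau > 0
    arg = alpha * tau[nz]
    cov[nz] = sigma2 * 2.0**(1 - nu) / gamma(nu) * arg**nu * kv(nu, arg)
    return cov

# Fully populated VCM for n equally spaced range measurements
# (sampling interval and Matérn parameters are assumed values)
n, dt = 200, 1e-3
lags = dt * np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
vcm = matern_cov(lags, sigma2=1e-6, alpha=50.0, nu=1.5)
```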

    The Atmospheric Scale Lengths of Turbulence and Its Dependencies Derived from GPS Single Difference with a Common Clock

    Microwave signals, for example those from Global Navigation Satellite Systems (GNSS) and very long baseline interferometry, are affected by tropospheric turbulence in such a way that the random fluctuations of the atmospheric index of refraction correlate the phase measurements. These atmospheric correlations are an important error source in space geodetic techniques. For computational reasons, they are neglected in positioning applications, to the detriment of a trustworthy description of the precision and of rigorous test statistics. Fortunately, modelling such correlations is possible by combining concepts from electromagnetic wave propagation in a random medium with the Kolmogorov turbulence theory. In this contribution, we process single-difference GNSS phase observations from a 300 m baseline between two receivers linked to a common clock. After preprocessing to filter out additional error contributions, such as multipath, we study the power spectral density of the phase residuals and estimate its low and high cutoff frequencies with an adapted unbiased Whittle maximum likelihood estimator. These cutoff frequencies, as predicted by turbulence theory, are directly related to the scale lengths of turbulence, i.e. the size of the eddies that correlate the GNSS observations. The study of their dependencies on the satellite geometry, the day of the year, or the time of the day provides new insights into two- and three-dimensional turbulence in the atmosphere. In addition, it contributes to improving the stochastic description of random effects impacting GNSS phase observations.
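
    The following sketch shows the shape of such a Whittle fit: a simplified Kolmogorov phase spectrum with slope -8/3 between a low and a high cutoff frequency, flat outside. The spectral form, the log-parameterization and the starting values are assumptions; the paper's unbiased estimator is more elaborate.

```python
import numpy as np
from scipy.optimize import minimize

def whittle_nll(params, freqs, pgram):
    """Whittle negative log-likelihood for a simplified phase
    spectrum: Kolmogorov f**(-8/3) between the two cutoffs, flat
    outside (assumed model, not the paper's exact spectral form)."""
    log_c, log_f0, log_f1 = params
    f = np.clip(freqs, np.exp(log_f0), np.exp(log_f1))
    spec = np.exp(log_c) * f ** (-8.0 / 3.0)
    return np.sum(np.log(spec) + pgram / spec)

# freqs, pgram: one-sided periodogram of the phase residuals
# fit = minimize(whittle_nll, x0=[0.0, np.log(1e-3), np.log(1e-1)],
#                args=(freqs, pgram), method='Nelder-Mead')
```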

    Characterization of the optical encoder angular noise from terrestrial laser scanners

    Rigorous statistical testing of deformation using a terrestrial laser scanner (TLS) can help avoid events such as structural collapses. Such a procedure necessitates an accurate description of the noise of the TLS measurements, which should include the correlations between angles. Unfortunately, these correlations often remain unaccounted for due to a lack of knowledge. This contribution addresses that challenge. We combine (i) a least-squares approximation to extract the geometry of the TLS point cloud, with the aim of analyzing the residuals of the fitting, and (ii) a specific filtering coupled with a maximum likelihood estimation to quantify the amount of flicker noise versus white noise. As a result, we are able to set up fully populated variance-covariance matrices of the TLS noise.
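
    A minimal sketch of the noise-mixture quantification idea: a Whittle fit of a flicker (1/f) plus white spectrum to the periodogram of the angular residuals, whose fitted amplitudes give the ratio of flicker to white noise. The specific filtering described above is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def flicker_white_nll(params, freqs, pgram):
    """Whittle negative log-likelihood for a flicker (1/f) plus
    white noise spectrum; the fitted amplitudes quantify flicker
    versus white noise in the angular residuals."""
    log_fl, log_wn = params
    spec = np.exp(log_fl) / freqs + np.exp(log_wn)
    return np.sum(np.log(spec) + pgram / spec)

# freqs, pgram: one-sided periodogram of the angular residuals
# fit = minimize(flicker_white_nll, x0=[0.0, 0.0],
#                args=(freqs, pgram), method='Nelder-Mead')
```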

    Deformation analysis using B-spline surface with correlated terrestrial laser scanner observations: a bridge under load

    The choice of an appropriate metric is mandatory to perform deformation analysis between two point clouds (PC): the distance has to be trustworthy and, simultaneously, robust against measurement noise, which may be correlated and heteroscedastic. The Hausdorff distance (HD) and its averaged variant (AHD) are widely used to compute local distances between two PC and are implemented in nearly all commercial software. Unfortunately, they are affected by measurement noise, particularly when correlations are present. In this contribution, we focus on terrestrial laser scanner (TLS) observations and assess the impact of neglecting correlations on the distance computation when a mathematical approximation is performed. The results of the simulations are extended to real observations from a bridge under load. Highly accurate laser tracker (LT) measurements were available for this experiment: they allow the comparison of the HD and AHD between two raw PC, or between their mathematical approximations, against reference values. Based on these results, we determine which distance is better suited to local deformation analysis in the case of heteroscedastic and correlated TLS observations. Finally, we set up a novel bootstrap testing procedure for this distance when the PC are approximated with B-spline surfaces.
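
    For reference, the two distances compared above can be computed from nearest-neighbour queries; a minimal sketch with SciPy (directed versions only; the symmetric HD is the maximum over the two directions):

```python
import numpy as np
from scipy.spatial import cKDTree

def directed_hd_ahd(pc_a, pc_b):
    """Directed Hausdorff (HD) and averaged Hausdorff (AHD) distances
    from point cloud A to B, both given as (n, 3) arrays."""
    d, _ = cKDTree(pc_b).query(pc_a)   # nearest neighbour in B per point of A
    return d.max(), d.mean()
```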

    Using Least-Squares Residuals to Assess the Stochasticity of Measurements—Example: Terrestrial Laser Scanner and Surface Modeling

    Terrestrial laser scanners (TLS) capture a large number of 3D points rapidly, with high precision and spatial resolution. These scanners are used for applications as diverse as the modeling of architectural or engineering structures and the high-resolution mapping of terrain. The noise of the observations cannot be assumed to correspond strictly to white noise: besides being heteroscedastic, the noise is likely to be correlated between observations due to the high scanning rate. Unfortunately, while the variance can sometimes be modeled based on physical or empirical considerations, correlations are most often neglected. Trustworthy knowledge of them is, however, mandatory to avoid an overestimation of the precision of the point cloud and, potentially, the non-detection of deformation between scans recorded at different epochs using statistical testing strategies. TLS point clouds can be approximated with parametric surfaces, such as planes, using the Gauss–Helmert model, or with the newly introduced T-spline surfaces. In both cases, the goal is to minimize the squared distance between the observations and the approximating surface in order to estimate parameters such as the normal vector or the control points. In this contribution, we show how the residuals of the surface approximation can be used to derive the correlation structure of the noise of the observations. We estimate the correlation parameters using the Whittle maximum likelihood and use comparable simulations and real data to validate our methodology. Using the least-squares adjustment as a “filter of the geometry” paves the way for the determination of a correlation model for many sensors recording 3D point clouds.
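
    The “filter of the geometry” idea can be made concrete with the simplest surface, a plane: after the fit, the signed point-to-plane residuals carry the noise whose correlation structure is then analyzed. The SVD-based total least-squares fit below is a simplified stand-in for the Gauss–Helmert adjustment mentioned above.

```python
import numpy as np

def plane_residuals(points):
    """Total least-squares plane fit of an (n, 3) point cloud; returns
    the signed orthogonal residuals used for the correlation analysis.
    Simplified stand-in for the Gauss-Helmert model."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]                     # singular vector of the smallest value
    return centered @ normal            # signed point-to-plane distances
```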

    Fitting Terrestrial Laser Scanner Point Clouds with T-Splines: Local Refinement Strategy for Rigid Body Motion

    T-splines have recently been introduced to represent objects of arbitrary shapes using a smaller number of control points than the conventional non-uniform rational B-splines (NURBS) or B-spline representations in computer-aided design, computer graphics and reverse engineering. They are flexible in representing complex surface shapes and economic in terms of parameters, as they enable local refinement. This property is a great advantage when dense, scattered and noisy point clouds, such as those from a terrestrial laser scanner (TLS), are approximated using least-squares fitting. Unfortunately, when it comes to assessing the goodness of fit of the surface approximation with a real dataset, only a noisy point cloud can be approximated: (i) a low root mean squared error (RMSE) can be linked with overfitting, i.e., a fitting of the noise, and should correspondingly be avoided, and (ii) a high RMSE is synonymous with a lack of detail. To address the challenge of judging the approximation, the reference surface should be entirely known: this can be achieved by printing a mathematically defined T-spline reference surface in three dimensions (3D) and modeling the artefacts induced by the 3D printing. Once the object is scanned under different configurations, it is possible to assess the goodness of fit of the approximation for a noisy and potentially gappy point cloud and to compare it with the traditional but less flexible NURBS. The advantages of T-spline local refinement open the door to further applications within a geodetic context, such as rigorous statistical testing of deformation. Two different scans from a slightly deformed object were approximated; we found that more than 40% of the computational time could be saved, without affecting the goodness of fit of the surface approximation, by using the same mesh for the two epochs.
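
    The two failure modes (i) and (ii) can be told apart only because the reference surface is known; a small sketch of the resulting check, where d_noisy and d_ref are hypothetical arrays of fit distances:

```python
import numpy as np

def rmse(distances):
    """Root mean squared error of orthogonal fit distances."""
    return np.sqrt(np.mean(np.square(distances)))

# d_noisy: distances of the fitted surface to the noisy scan points
# d_ref:   distances of the fitted surface to the printed reference
# A very low rmse(d_noisy) combined with a growing rmse(d_ref)
# indicates overfitting: the surface follows the noise, not the shape.
```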

    On the impact of correlations on the congruence test: a bootstrap approach: Case study: B-spline surface fitting from TLS observations

    The detection of deformation is one of the major tasks in surveying engineering. It is meaningful only if the statistical significance of the distortions is correctly investigated, which often relies on a parametric modelization of the object under consideration. So-called regression B-spline approximation can be performed for the point clouds of terrestrial laser scanners, allowing a specific congruence test based on the B-spline surfaces to be set up. Such tests are known to be strongly influenced by the underlying stochastic model chosen for the observation errors. The latter has to be correctly specified, which includes accounting for heteroscedasticity and correlations. In this contribution, we justify and make use of a parametric correlation model called the Matérn model to approximate the variance-covariance matrix (VCM) of the residuals by performing their empirical mode decomposition. The VCM obtained is integrated into the computation of the congruence test statistics for a more trustworthy test decision. Using a real case study, we estimate the distribution of the test statistics with a bootstrap approach, in which no parametric assumptions are made about the underlying population that generated the random sample. This procedure allows us to assess the impact of neglecting correlations on the critical value of the congruence test, highlighting their importance.
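
    The bootstrap idea reduces to resampling a test statistic and reading off an empirical quantile as the critical value. The sketch below uses a plain iid resampling of residuals for brevity; the procedure in the paper resamples in a way that preserves the estimated Matérn correlation structure.

```python
import numpy as np

def bootstrap_critical_value(stat_fn, residuals, n_boot=2000,
                             alpha=0.05, seed=0):
    """Empirical (1 - alpha) quantile of a test statistic under
    resampling; plain iid bootstrap, assumed for illustration."""
    rng = np.random.default_rng(seed)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(residuals, size=residuals.size, replace=True)
        stats[b] = stat_fn(sample)
    return np.quantile(stats, 1.0 - alpha)
```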

    Optimal Surface Fitting of Point Clouds Using Local Refinement

    This open access book provides insights into the novel Locally Refined B-spline (LR B-spline) surface format, which is suited for representing terrain and seabed data in a compact way. It provides an alternative to the well-known raster and triangulated surface representations. An LR B-spline surface has an overall smooth behavior and allows the modeling of local details with only a limited growth in data volume. In regions where many data points belong to the same smooth area, LR B-splines allow a very lean representation of the shape by locally adapting the resolution of the spline space to the size and local shape variations of the region. The iterative method can be modified to improve the accuracy in particular domains of a point cloud. The use of statistical information criteria can help determine the optimal threshold, the number of iterations to perform, as well as some parameters of the underlying mathematical functions (degree of the splines, parameter representation). The resulting surfaces are well suited for analysis and for computing secondary information such as contour curves and minimum and maximum points. Deformation analysis is also a potential application of fitting point clouds with LR B-splines.
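
    A runnable one-dimensional analogue of the iterative local refinement loop, using SciPy's least-squares splines: knots are inserted only where the residual exceeds the tolerance, so the spline space grows only where the data demand it. Data, tolerance and knot placement are illustrative assumptions; the LR B-spline format itself refines a bivariate spline space.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

# Synthetic noisy samples of a locally varying profile (assumed data)
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 500))
y = np.exp(-40.0 * (x - 0.3) ** 2) + 0.01 * rng.standard_normal(x.size)

knots = list(np.linspace(0.1, 0.9, 3))        # coarse interior knots
for _ in range(8):
    spline = LSQUnivariateSpline(x, y, knots, k=3)
    err = np.abs(spline(x) - y)
    if err.max() <= 0.03:                     # accuracy threshold reached
        break
    new_knot = float(np.clip(x[err.argmax()], 0.05, 0.95))
    knots = sorted(set(knots + [new_knot]))   # refine near the worst fit
```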

    Classification of Terrestrial Laser Scanner Point Clouds: A Comparison of Methods for Landslide Monitoring from Mathematical Surface Approximation

    Terrestrial laser scanners (TLS) are contact-free measuring sensors that record dense point clouds of objects or scenes by acquiring coordinates and an intensity value for each point. The point clouds are scattered and noisy. Performing a mathematical surface approximation, instead of working directly on the point cloud, is an efficient way to reduce the data storage and to structure the point clouds by transforming “data” into “information”. Applications include rigorous statistical testing for deformation analysis within the context of landslide monitoring. In order to reach an optimal approximation, classification and segmentation algorithms can identify and remove inhomogeneous structures, such as trees or bushes, to obtain a smooth and accurate mathematical surface of the ground. In this contribution, we compare methods to perform the classification of TLS point clouds with the aim of guiding the reader through the existing algorithms. Besides the traditional point cloud filtering methods, we analyze machine learning classification algorithms based on the manual extraction of point cloud features, and a deep learning approach with automatic extraction of features called PointNet++. We have intentionally chosen strategies that are easy to implement and understand, so that our results are reproducible for similar point clouds. We show that each method has advantages and drawbacks, depending on user criteria such as the computational time, the classification accuracy needed, whether manual extraction is performed or not, and whether prior information is required. We highlight that filtering methods are advantageous for the application at hand and perform a mathematical surface approximation as an illustration. Accordingly, we have chosen locally refined B-splines, which were shown to provide an optimal and computationally manageable approximation of TLS point clouds.
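
    As an example of the manually extracted features mentioned above, a common choice are the eigenvalue-based descriptors of the local neighbourhood covariance (linearity, planarity, sphericity). A minimal sketch, with the neighbourhood size k as an assumed parameter:

```python
import numpy as np
from scipy.spatial import cKDTree

def covariance_features(points, k=20):
    """Per-point linearity, planarity and sphericity derived from the
    eigenvalues of the local k-neighbourhood covariance; a typical
    hand-crafted input for machine-learning classifiers."""
    _, idx = cKDTree(points).query(points, k=k)
    features = np.empty((len(points), 3))
    for i, nb in enumerate(idx):
        ev = np.linalg.eigvalsh(np.cov(points[nb].T))[::-1]   # l1 >= l2 >= l3
        l1, l2, l3 = ev / ev.sum()
        features[i] = (l1 - l2) / l1, (l2 - l3) / l1, l3 / l1
    return features
```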